Training a Spiking Neural Network to Generate Online Handwriting Movements

Authors

  • Mahmoud Ltaief
  • Hala Bezine
  • Adel M. Alimi
Abstract

In this paper, we develop a spiking neural network model that learns to generate online handwriting movements. The architecture is a feedforward network with one hidden layer: the input layer encodes a set of Beta-elliptic parameters, the hidden layer contains both excitatory and inhibitory neurons, and the output layer provides the script coordinates x(t) and y(t). The proposed spiking neural network is trained according to Sander Bohte's model and has been successfully tested on the MAYASTROUN database. A comparative study between the proposed spiking neural network and an artificial neural network proposed in previous work is also presented.
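The architecture the abstract describes (spike-encoded inputs, a mixed excitatory/inhibitory hidden layer, spiking outputs) can be illustrated with a minimal leaky integrate-and-fire forward pass. This is a sketch only: the Beta-elliptic parameter extraction, Bohte's training rule, and the decoding of output spikes into x(t), y(t) coordinates are not reproduced, and all names and parameter values are illustrative rather than taken from the paper.

```python
import numpy as np

def lif_forward(spike_times, weights, tau=4.0, v_th=1.0, dt=0.1, t_max=20.0):
    """One feedforward layer of leaky integrate-and-fire neurons.

    spike_times : firing time of each input neuron (one spike each)
    weights     : (n_in, n_out) synaptic weights; positive entries act as
                  excitatory synapses, negative entries as inhibitory ones
    Returns the first firing time of each output neuron (np.inf if silent).
    """
    n_out = weights.shape[1]
    v = np.zeros(n_out)                      # membrane potentials
    out_times = np.full(n_out, np.inf)
    for k in range(int(t_max / dt)):
        t = k * dt
        v *= 1.0 - dt / tau                  # exponential leak
        arrived = np.isclose(spike_times, t)
        if arrived.any():                    # integrate incoming spikes
            v += arrived.astype(float) @ weights
        fired = (v >= v_th) & np.isinf(out_times)
        out_times[fired] = t                 # record first spike time
        v[v >= v_th] = 0.0                   # reset after firing
    return out_times

# Two inputs driving one output: an excitatory and an inhibitory synapse.
times = lif_forward(np.array([0.0, 0.0]), np.array([[1.5], [-0.2]]))
```

Because the inhibitory input only partially cancels the excitatory one, the output neuron still crosses threshold and fires immediately; flipping the weight signs would silence it.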


Similar References

Sequence learning with hidden units in spiking neural networks

We consider a statistical framework in which recurrent networks of spiking neurons learn to generate spatio-temporal spike patterns. Given biologically realistic stochastic neuronal dynamics we derive a tractable learning rule for the synaptic weights towards hidden and visible neurons that leads to optimal recall of the training sequences. We show that learning synaptic weights towards hidden ...

Modeling of directional operations in the motor cortex: A noisy network of spiking neurons is trained to generate a neural-vector trajectory

A fully connected network of spiking neurons modeling motor cortical directional operations is presented and analyzed. The model allows for the basic biological requirements stemming from the results of experimental studies. The dynamical evolution of the network's output is interpreted as the sequential generation of neuronal population vectors representing the combined directional tenden...

Generating Sequences With Recurrent Neural Networks

This paper shows how Long Short-term Memory recurrent neural networks can be used to generate complex sequences with long-range structure, simply by predicting one data point at a time. The approach is demonstrated for text (where the data are discrete) and online handwriting (where the data are real-valued). It is then extended to handwriting synthesis by allowing the network to condition its ...
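The "one data point at a time" generation this paper describes reduces to a simple autoregressive loop: each predicted point is fed back as the next input. The sketch below shows only that control flow; `step_fn` is a stand-in for a trained recurrent model's single-step prediction, not an LSTM implementation.

```python
def generate(step_fn, seed, n_steps):
    """Autoregressive sampling loop: each predicted point is fed back as
    the next input. step_fn(x, state) -> (next_x, next_state) stands in
    for a trained recurrent network's one-step prediction."""
    x, state = seed, None
    samples = []
    for _ in range(n_steps):
        x, state = step_fn(x, state)
        samples.append(x)
    return samples

# Stub model that just increments its input, to show the loop's shape.
trace = generate(lambda x, state: (x + 1, state), 0, 3)
```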

Accelerating Recurrent Neural Network Training

An efficient algorithm for recurrent neural network training is presented. The approach increases the training speed for tasks where a length of the input sequence may vary significantly. The proposed approach is based on the optimal batch bucketing by input sequence length and data parallelization on multiple graphical processing units. The baseline training performance without sequence bucket...
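The bucketing idea described above can be sketched in a few lines: sorting sequences by length before batching means each batch is padded only to its own longest member instead of the global maximum. `bucket_batches` is an illustrative name, and the multi-GPU data parallelization is omitted.

```python
def bucket_batches(lengths, batch_size):
    """Sort sequence indices by length, then cut consecutive batches, so
    padding within each batch is limited to that batch's longest sequence
    (illustrative sketch of length bucketing, not the paper's code)."""
    order = sorted(range(len(lengths)), key=lambda i: lengths[i])
    return [order[i:i + batch_size] for i in range(0, len(order), batch_size)]

# Sequences of length 5, 2, 9, 3 grouped into similar-length batches.
batches = bucket_batches([5, 2, 9, 3], batch_size=2)
```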

The effect of large training set sizes on online Japanese Kanji and English cursive recognizers

Much research in handwriting recognition has focused on how to improve recognizers with constrained training set sizes. This paper presents the results of training a nearest-neighbor based online Japanese Kanji recognizer and a neural-network based online cursive English recognizer on a wide range of training set sizes, including sizes not generally available. The experiments demonstrate that i...



Journal:

Volume   Issue

Pages  -

Publication date: 2016